Results on Learnability and the Vapnik-Chervonenkis Dimension
Abstract
We consider the problem of learning a concept from examples in the distribution-free model of Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to establish the learnability of various concept classes with an infinite Vapnik-Chervonenkis dimension. We also discuss an important variation on the problem of learning from examples, called approximating from examples. Here we do not assume that the target concept T is a member of the concept class C from which approximations are chosen. This problem takes on particular interest when the VC dimension of C is infinite. Finally, we discuss the problem of computing the VC dimension of a finite concept set defined on a finite domain and consider the structure of classes of a fixed small dimension.
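The last problem mentioned, computing the VC dimension of a finite concept class over a finite domain, admits a direct brute-force solution: a set S is shattered exactly when the concepts, restricted to S, realize all 2^|S| labelings of S. A minimal Python sketch of this check (the function names and the representation of concepts as sets are illustrative, not taken from the paper):

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class.

    domain   -- list of points
    concepts -- iterable of subsets of domain (each given as a set)
    Returns the size of the largest subset of domain shattered by concepts.
    """
    concepts = [frozenset(c) for c in concepts]
    best = 0
    for d in range(1, len(domain) + 1):
        found = False
        for subset in combinations(domain, d):
            # S is shattered iff the restrictions of the concepts to S
            # realize all 2^|S| possible labelings of S.
            traces = {c & frozenset(subset) for c in concepts}
            if len(traces) == 2 ** d:
                found = True
                break
        if not found:
            break  # if no d-set is shattered, no larger set is either
        best = d
    return best

# Singletons plus the empty set shatter any one point but no pair,
# so the VC dimension of this class is 1.
print(vc_dimension([0, 1, 2], [set(), {0}, {1}, {2}]))  # -> 1
```

The search is exponential in the size of the candidate shattered set, which is one reason the complexity of this computation is a question worth studying in its own right.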
Similar Resources
On Metric Entropy, Vapnik-Chervonenkis Dimension, and Learnability for a Class of Distributions (September 1989, LIDS-P-1910)
In [23], Valiant proposed a formal framework for distribution-free concept learning which has generated a great deal of interest. A fundamental result regarding this framework was proved by Blumer et al. [6], characterizing those concept classes which are learnable in terms of their Vapnik-Chervonenkis (VC) dimension. More recently, Benedek and Itai [4] studied learnability with respect to a fix...
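The Blumer et al. characterization comes with explicit sample-size bounds. One commonly cited sufficient sample size for learning a class of VC dimension $d$ to accuracy $\varepsilon$ with confidence $1-\delta$ (exact constants vary across presentations of the result, so read this as one standard form rather than the unique statement):

$$ m(\varepsilon, \delta) \;\ge\; \max\!\left( \frac{4}{\varepsilon} \log_2 \frac{2}{\delta},\; \frac{8d}{\varepsilon} \log_2 \frac{13}{\varepsilon} \right) $$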
Relating Data Compression and Learnability
We explore the learnability of two-valued functions from samples using the paradigm of Data Compression. A first algorithm (compression) chooses a small subset of the sample which is called the kernel. A second algorithm predicts future values of the function from the kernel, i.e. the algorithm acts as a hypothesis for the function to be learned. The second algorithm must be able to reconstruct...
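The two-algorithm paradigm is easy to see on a concrete class. Below is a minimal Python sketch for closed intervals on the real line, where the kernel consists of at most the two extreme positive examples; the names and the sample representation are illustrative, not taken from the paper.

```python
def compress(sample):
    """Compression algorithm: keep only the extreme positive examples
    of the labeled sample [(point, label), ...] as the kernel."""
    positives = [x for x, label in sample if label]
    if not positives:
        return []  # empty kernel encodes the always-negative hypothesis
    return [(min(positives), True), (max(positives), True)]

def predict(kernel, x):
    """Reconstruction algorithm: the kernel acts as a hypothesis,
    predicting positive iff x lies between the kernel's endpoints."""
    if not kernel:
        return False
    lo, hi = kernel[0][0], kernel[-1][0]
    return lo <= x <= hi

# Whenever the sample is consistent with some interval, the
# reconstructed hypothesis is consistent with the entire sample.
sample = [(0.5, False), (1.0, True), (2.0, True), (3.5, False)]
kernel = compress(sample)
assert all(predict(kernel, x) == label for x, label in sample)
```

The kernel has constant size here, independent of the sample size, which is the sense in which this class compresses well.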
Results on learnability and the Vapnik-Chervonenkis dimension (Extended Abstract)
We consider the problem of learning a concept from examples in the distribution-free model of Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to estab...
Generalization of Elman Networks
The Vapnik-Chervonenkis dimension of Elman networks is infinite. Here, we find constructions leading to lower bounds for the fat-shattering dimension that are linear, respectively of order log², in the input length, even in the case of limited weights and inputs. Since finiteness of this magnitude is equivalent to learnability, there is no a priori guarantee for the generalization capability of Elman networks.
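For context, the fat-shattering dimension replaces exact shattering by shattering with a margin: a set is γ-fat-shattered if there exist witness levels r(x) such that every above/below pattern relative to r(x) ± γ is realized by some function in the class. A hedged Python sketch of the pattern check for a finite class and a fixed witness vector (all names are illustrative; the dimension itself is the largest set size for which some choice of witnesses passes this check):

```python
from itertools import product

def fat_shatters(functions, points, witnesses, gamma):
    """Check whether `functions` gamma-fat-shatters `points` with the
    given witness levels: every 0/1 pattern must be realized by some f
    with f(x) >= r(x) + gamma where the pattern is 1 and
    f(x) <= r(x) - gamma where it is 0."""
    for pattern in product([0, 1], repeat=len(points)):
        realized = any(
            all(
                (f(x) >= witnesses[x] + gamma) if b
                else (f(x) <= witnesses[x] - gamma)
                for x, b in zip(points, pattern)
            )
            for f in functions
        )
        if not realized:
            return False
    return True

# Four affine functions realize all four patterns on {0, 1} with
# witness level 0 and margin 0.5.
fs = [lambda x: -1.0, lambda x: 1.0, lambda x: 2 * x - 1, lambda x: 1 - 2 * x]
print(fat_shatters(fs, [0.0, 1.0], {0.0: 0.0, 1.0: 0.0}, 0.5))  # -> True
```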
RIFIS Technical Report: Complexity of Computing Generalized VC-Dimensions
In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays the key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of multi-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC*-dimension, the Ψ*-dimension, and the ΨG-dimension. For each dim...
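To illustrate what "generalized" shattering means for multi-valued functions, here is a brute-force check of one standard generalization, Natarajan shattering; this is offered as an illustration of the flavor of such definitions, not as the specific dimensions studied in the report. A set is N-shattered if two labelings that disagree at every point can be mixed coordinate-wise in all 2^d ways, each mixture being realized by the class.

```python
from itertools import combinations, product

def natarajan_shatters(functions, points):
    """Check Natarajan shattering of `points` by a finite class of
    multi-valued functions, each given as a dict from point to value.
    Witnesses t0, t1 can be taken among the realized traces, since the
    all-t0 and all-t1 mixtures must themselves be realized."""
    traces = {tuple(f[x] for x in points) for f in functions}
    d = len(points)
    for t0, t1 in product(traces, repeat=2):
        if all(a != b for a, b in zip(t0, t1)):
            # every subset T takes t1-values on T, t0-values elsewhere
            if all(
                tuple(t1[i] if i in T else t0[i] for i in range(d)) in traces
                for r in range(d + 1)
                for T in (set(c) for c in combinations(range(d), r))
            ):
                return True
    return False

# Over the value set {0, 1, 2}, these four functions realize every
# mixture of t0 = (0, 0) and t1 = (1, 2) on the points a and b.
fs = [{'a': 0, 'b': 0}, {'a': 0, 'b': 2}, {'a': 1, 'b': 0}, {'a': 1, 'b': 2}]
print(natarajan_shatters(fs, ['a', 'b']))  # -> True
```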
Publication date: 2003